
Search in the Catalogues and Directories

Hits 1 – 20 of 35

1
Overview of LifeCLEF 2021: an evaluation of Machine-Learning based Species Identification and Species Distribution Prediction
In: Experimental IR Meets Multilinguality, Multimodality, and Interaction (K. Selçuk Candan, Bogdan Ionescu, Lorraine Goeuriot, Birger Larsen, Henning Müller, Alexis Joly, Maria Maistro, Florina Piroi, Guglielmo Faggioli, Nicola Ferro, eds.), Lecture Notes in Computer Science 12880, Springer International Publishing, pp. 371-393, 2021, ⟨10.1007/978-3-030-85251-1_24⟩; https://hal.inria.fr/hal-03415990 (2021)
2
Detection of Hate Speech Spreaders using Convolutional Neural Networks
Siino, Marco; Di Nuovo, Elisa; Tinnirello, Ilenia; La Cascia, Marco. - Aachen : CEUR, 2021
3
Overview of the CLEF eHealth Evaluation Lab 2019
Kelly, Liadh; Suominen, Hanna; Goeuriot, Lorraine. - : Springer Verlag, 2019
4
Overview of the CLEF eHealth evaluation lab 2018
Suominen, Hanna; Kelly, Liadh; Goeuriot, Lorraine. - : Springer International Publishing, 2018
5
Overview of the CLEF 2016 Social Book Search Lab
Koolen, Marijn; Bogers, Toine; Gäde, Maria. - : Springer, 2016
6
The IR Task at the CLEF eHealth evaluation lab 2016: User-centred health information retrieval
7
Question Answering over Linked Data (QALD-4)
In: Working Notes for CLEF 2014 Conference, Sep 2014, Sheffield, United Kingdom; https://hal.inria.fr/hal-01086472 (2014)
8
Information Access Evaluation meets Multilinguality, Multimodality, and Visual Analytics
In: http://sigir.org/forum/2012D/p029.pdf (2012)
9
Information Access Evaluation meets Multilinguality, Multimodality, and Visual Analytics
In: http://wwwhome.cs.utwente.nl/~hiemstra/papers/sigirforum12.pdf (2012)
10
GeoCLEF 2008: The CLEF 2008 Cross-Language Geographic Information Retrieval Track Overview
In: http://www.clef-campaign.org/2008/working_notes/GeoCLEF-2008-overview-notebook-paperWNfinal.pdf (2008)
11
From CLEF to TrebleCLEF: the Evolution of the Cross-Language Evaluation Forum
In: http://www.mt-archive.info/NTCIR-2008-Ferro.pdf (2008)
12
CLEF: Ongoing Activities and Plans for the Future
In: http://www.mt-archive.info/NTCIR-2007-Agosti.pdf (2007)
13
University of Padua at CLEF 2002: Experiments to evaluate a statistical stemming algorithm
In: http://www.clef-campaign.org/workshop2002/WN/20.pdf
14
Performance evaluation. General Terms
In: http://www.clef-campaign.org/2007/working_notes/dinunzioCLEF2007.pdf
15
From CLEF to TrebleCLEF: promoting Technology Transfer for Multilingual Information Retrieval
In: http://dis.shef.ac.uk/mark/publications/my_papers/DELOS_CLEFtoTrebleCLEF_20071116-d.pdf
16
GeoCLEF 2007: The CLEF 2007 Cross-Language Geographic Information Retrieval Track Overview
In: http://www.clef-campaign.org/2007/working_notes/mandlCLEF2007_Geo_Overview.pdf
17
GeoCLEF 2007: The CLEF 2007 Cross-Language Geographic Information Retrieval Track Overview
In: http://www.linguateca.pt/Diana/download/MandletalGeoCLEF2007WN.pdf
Abstract: GeoCLEF ran as a regular track for the second time within the Cross Language Evaluation Forum (CLEF) 2007. The purpose of GeoCLEF is to test and evaluate cross-language geographic information retrieval (GIR): retrieval for topics with a geographic specification. GeoCLEF 2007 consisted of two sub-tasks. A search task ran for the third time and a query classification task was organized for the first time. For the GeoCLEF 2007 search task, twenty-five search topics were defined by the organizing groups for searching English, German, Portuguese and Spanish document collections. Topics were translated into English, German and Spanish. Several topics in 2007 were geographically challenging. Thirteen groups submitted 108 runs. The groups used a variety of approaches. For the classification task, a query log from a search engine was provided and the groups needed to identify the queries with a geographic scope and the geographic components within the local queries.
Keywords: Multilingual Information Retrieval; Geographic Information Retrieval; Evaluation; H.3.4 Systems and Software. General terms: Measurement, Performance, Experimentation, Standards, Retrieval
URL: http://www.linguateca.pt/Diana/download/MandletalGeoCLEF2007WN.pdf
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.81.4254
18
CLEF 15th Birthday: Past, Present, and Future
In: http://sigir.org/files/forum/2014D/p031.pdf
19
Performance evaluation. General Terms
In: http://www.clef-campaign.org/2006/working_notes/workingnotes2006/dinunzioOCLEF2006.pdf
20
Additional Keywords and Phrases
In: http://www.clef-campaign.org/2005/working_notes/workingnotes2005/dinunzio05.pdf
